Sharper Bounds for Regression and Low-Rank Approximation with Regularization

Authors

  • Haim Avron (Tel Aviv University)
  • Kenneth L. Clarkson (IBM Research Almaden)
  • David P. Woodruff (IBM Research Almaden)
Abstract

We study matrix sketching methods for regularized variants of linear regression, low-rank approximation, and canonical correlation analysis. Our main focus is on sketching techniques which preserve the objective function value for regularized problems, an area that has remained largely unexplored. We study regularization both in a fairly broad setting and in the specific context of the popular and widely used technique of ridge regularization; for the latter, as applied to each of these problems, we show algorithmic resource bounds in which the statistical dimension appears in places where in previous bounds the rank would appear. The statistical dimension is always smaller than the rank, and decreases as the amount of regularization increases. In particular, for the ridge low-rank approximation problem $\min_{Y,X} \|YX - A\|_F^2 + \lambda\|Y\|_F^2 + \lambda\|X\|_F^2$, where $Y \in \mathbb{R}^{n \times k}$ and $X \in \mathbb{R}^{k \times d}$, we give an approximation algorithm needing $O(\mathrm{nnz}(A)) + \tilde{O}((n+d)\,\varepsilon^{-1}k \min\{k, \varepsilon^{-1}\mathrm{sd}_\lambda(Y^*)\}) + \mathrm{poly}(\mathrm{sd}_\lambda(Y^*)\,\varepsilon^{-1})$ time, where $\mathrm{sd}_\lambda(Y^*) \le k$ is the statistical dimension of $Y^*$, $Y^*$ is an optimal $Y$, $\varepsilon$ is an error parameter, and $\mathrm{nnz}(A)$ is the number of nonzero entries of $A$. This is faster than prior work, even when $\lambda = 0$. We also study regularization in a much more general setting. For example, we obtain sketching-based algorithms for the low-rank approximation problem $\min_{X,Y} \|YX - A\|_F^2 + f(Y,X)$ where $f(\cdot,\cdot)$ is a regularizing function satisfying some very general conditions (chiefly, invariance under orthogonal transformations).
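To make the central quantity concrete: for a matrix with singular values $\sigma_1, \ldots, \sigma_k$, the statistical dimension at regularization level $\lambda$ is $\mathrm{sd}_\lambda = \sum_i \sigma_i^2/(\sigma_i^2 + \lambda)$, which equals the rank at $\lambda = 0$ and shrinks as $\lambda$ grows. A minimal NumPy sketch of this definition follows; the function name and test matrix are illustrative, not from the paper.

```python
import numpy as np

def statistical_dimension(Y, lam):
    """sd_lambda(Y) = sum_i sigma_i^2 / (sigma_i^2 + lam).

    Equals rank(Y) when lam == 0 and decreases monotonically as lam grows.
    """
    sigma = np.linalg.svd(Y, compute_uv=False)  # singular values of Y
    return np.sum(sigma**2 / (sigma**2 + lam))

# Illustration: sd_lambda(Y) <= rank(Y) = 10, shrinking with lambda.
rng = np.random.default_rng(0)
Y = rng.standard_normal((100, 10))  # full column rank with probability 1
for lam in [0.0, 1.0, 100.0]:
    print(lam, statistical_dimension(Y, lam))
```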


Similar Resources

Sharper Bounds for Regularized Data Fitting

We study matrix sketching methods for regularized variants of linear regression, low rank approximation, and canonical correlation analysis. Our main focus is on sketching techniques which preserve the objective function value for regularized problems, which is an area that has remained largely unexplored. We study regularization both in a fairly broad setting, and in the specific context of th...


On the Impact of Kernel Approximation on Learning Accuracy

Kernel approximation is commonly used to scale kernel-based algorithms to applications containing as many as several million instances. This paper analyzes the effect of such approximations in the kernel matrix on the hypothesis generated by several widely used learning algorithms. We give stability bounds based on the norm of the kernel approximation for these algorithms, including SVMs, KRR, ...
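As a concrete instance of the setting analyzed there, the sketch below (not code from that paper) forms a Nyström approximation of an RBF kernel matrix and compares the exact and approximate kernel ridge regression (KRR) coefficients; the perturbation of the hypothesis is governed by the norm of the kernel approximation error. The kernel width, landmark count, and data are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # K[i, j] = exp(-gamma * ||x_i - z_j||^2)
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
y = rng.standard_normal(200)
lam = 0.1

K = rbf_kernel(X, X)

# Nystrom approximation from m sampled landmark columns.
m = 20
idx = rng.choice(len(X), m, replace=False)
C = K[:, idx]                      # n x m column sample
W = K[np.ix_(idx, idx)]            # m x m landmark block
K_approx = C @ np.linalg.pinv(W) @ C.T

# KRR dual coefficients under the exact and approximate kernels.
alpha_exact = np.linalg.solve(K + lam * np.eye(len(X)), y)
alpha_approx = np.linalg.solve(K_approx + lam * np.eye(len(X)), y)

print("||K - K~||_2 =", np.linalg.norm(K - K_approx, 2))
print("coefficient perturbation:", np.linalg.norm(alpha_exact - alpha_approx))
```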


Random Projections for Low Multilinear Rank Tensors

We propose two randomized tensor algorithms for reducing multilinear ranks in the Tucker format. These algorithms build on the randomized SVD of Halko, Martinsson, and Tropp [9]. Here we provide randomized versions of the higher-order SVD and higher-order orthogonal iteration. Moreover, we provide sharper probabilistic error bounds for the matrix low-rank approximation....
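The matrix-level primitive underlying those tensor algorithms is the Halko–Martinsson–Tropp randomized SVD. A minimal sketch is below; the oversampling and power-iteration parameters are illustrative defaults, not values from that paper.

```python
import numpy as np

def randomized_svd(A, k, oversample=10, power_iters=2, seed=0):
    """Rank-k truncated SVD via the Halko-Martinsson-Tropp range finder."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    # Sketch the range of A with a Gaussian test matrix.
    Omega = rng.standard_normal((n, k + oversample))
    Y = A @ Omega
    # Optional power iterations sharpen the captured subspace.
    for _ in range(power_iters):
        Y = A @ (A.T @ Y)
    Q, _ = np.linalg.qr(Y)
    # Project onto the subspace and take a small exact SVD.
    B = Q.T @ A
    U_small, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ U_small)[:, :k], s[:k], Vt[:k, :]

A = np.random.default_rng(1).standard_normal((500, 300))
U, s, Vt = randomized_svd(A, k=20)
print(np.linalg.norm(A - U @ np.diag(s) @ Vt))  # near-best rank-20 error
```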


Ridge Regression and Provable Deterministic Ridge Leverage Score Sampling

Ridge leverage scores provide a balance between low-rank approximation and regularization, and are ubiquitous in randomized linear algebra and machine learning. Deterministic algorithms are also of interest in the moderately big data regime, because deterministic algorithms provide interpretability to the practitioner by having no failure probability and always returning the same results. We pr...
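For reference, the $i$-th ridge leverage score of $A$ at level $\lambda$ is $\tau_i^\lambda(A) = a_i^\top (A^\top A + \lambda I)^{-1} a_i$, where $a_i$ is the $i$-th row of $A$, and the scores sum to the statistical dimension. The sketch below computes them via the SVD; it is illustrative and is not the deterministic sampling algorithm that paper develops.

```python
import numpy as np

def ridge_leverage_scores(A, lam):
    """tau_i = a_i^T (A^T A + lam I)^{-1} a_i for each row a_i of A.

    Computed stably through the SVD: the diagonal of
    A (A^T A + lam I)^{-1} A^T = U diag(s^2 / (s^2 + lam)) U^T
    is a weighted row-sum of squared entries of U.
    """
    U, s, _ = np.linalg.svd(A, full_matrices=False)
    weights = s**2 / (s**2 + lam)
    return (U**2) @ weights

A = np.random.default_rng(2).standard_normal((50, 8))
tau = ridge_leverage_scores(A, lam=1.0)
print(tau.sum())  # equals the statistical dimension sd_lambda(A)
```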


Learning low-rank output kernels

Output kernel learning techniques make it possible to simultaneously learn a vector-valued function and a positive semidefinite matrix which describes the relationships between the outputs. In this paper, we introduce a new formulation that imposes a low-rank constraint on the output kernel and operates directly on a factor of the kernel matrix. First, we investigate the connection between output kernel l...



Journal:
  • CoRR

Volume: abs/1611.03225

Publication date: 2016